DroNet: Learning to Fly by Driving
References
DroNet: Learning to Fly by Driving. IEEE Robotics and Automation Letters (RA-L), 2018.
PDF · YouTube · Software and Datasets

Download
GitHub repository for the project.
This repository contains all the code we used to train and test DroNet, along with a detailed README that you can follow to reproduce our results.
Collision dataset: Available Here
This archive contains the labeled collision data we used to let DroNet predict potentially dangerous situations.
DroNet weights: Available Here
This archive contains the trained weights of DroNet used in all our real-world experiments.
Description
DroNet (IEEE RAL'18) is a convolutional neural network that can safely drive a drone through the streets of a city.
In unstructured and highly dynamic scenarios, drones face numerous challenges to navigating autonomously in a feasible and safe way. Because flying a drone in an urban environment is dangerous, collecting training data by flying is impractical. For that reason, DroNet learns how to fly by imitating the behavior of manned vehicles that are already integrated into such environments. For each input image captured by a forward-looking camera, it produces a steering angle and a collision probability. These high-level predictions are then translated into control commands so that the drone keeps navigating while avoiding obstacles.
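The mapping from the network's two outputs to drone commands can be sketched as follows. This is a minimal illustration, not the authors' exact controller: the function name, the `v_max` and `alpha` tuning parameters, and the low-pass filtering scheme are assumptions for the sake of the example.

```python
def outputs_to_commands(steering, p_collision, state, v_max=1.0, alpha=0.7):
    """Turn DroNet's predictions into velocity commands (illustrative sketch).

    steering     : predicted steering angle in [-1, 1]
    p_collision  : predicted probability of collision in [0, 1]
    state        : dict holding the previous filtered commands
    v_max, alpha : assumed max forward speed and low-pass filter coefficient
    """
    # Slow down as the collision probability rises; stop when it reaches 1.
    v_desired = (1.0 - p_collision) * v_max

    # Low-pass filter both commands for smooth flight.
    state["v"] = alpha * state["v"] + (1.0 - alpha) * v_desired
    state["yaw"] = alpha * state["yaw"] + (1.0 - alpha) * steering
    return state["v"], state["yaw"]
```

Filtering the raw network outputs is one simple way to avoid jerky maneuvers when consecutive predictions disagree; the actual gains would be tuned on the target platform.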
DroNet is both versatile and efficient. First, it works in very different environments, both indoor and outdoor, without any prior knowledge about them. Indeed, with neither a map of the environment nor any retraining or fine-tuning, our method generalizes to scenarios completely unseen at training time, including indoor corridors, parking lots, and high altitudes. Second, DroNet was designed to require very little computational resources compared to most existing deep convolutional networks, which allows real-time performance even on a CPU.